# Indonesian Pretraining

## Bert Base Indonesian 522M

- **Author:** cahya · **License:** MIT
- **Tags:** Large Language Model, Other
- **Downloads:** 2,799 · **Likes:** 25

A BERT base model pretrained on Indonesian Wikipedia with the Masked Language Modeling (MLM) objective. The model is uncased (case-insensitive).
## Indobert Large P2

- **Author:** indobenchmark · **License:** MIT
- **Tags:** Large Language Model, Other
- **Downloads:** 2,272 · **Likes:** 8

IndoBERT is a state-of-the-art Indonesian language model based on the BERT architecture, trained with the Masked Language Modeling (MLM) and Next Sentence Prediction (NSP) objectives.
## Indonesian Roberta Base

- **Author:** flax-community · **License:** MIT
- **Tags:** Large Language Model, Other
- **Downloads:** 1,013 · **Likes:** 11

An Indonesian masked language model based on the RoBERTa architecture, trained on the OSCAR corpus and reaching a validation accuracy of 62.45%.
## Indobert Base P1

- **Author:** indobenchmark · **License:** MIT
- **Tags:** Large Language Model, Other
- **Downloads:** 261.95k · **Likes:** 25

IndoBERT is an advanced Indonesian language model based on BERT, trained with the Masked Language Modeling (MLM) and Next Sentence Prediction (NSP) objectives.
## Indobert Lite Base P1

- **Author:** indobenchmark · **License:** MIT
- **Tags:** Large Language Model, Transformers, Other
- **Downloads:** 723 · **Likes:** 0

IndoBERT is a BERT model variant tailored to the Indonesian language, trained with the masked language modeling and next sentence prediction objectives. The Lite version is a lightweight model suited to resource-constrained environments.
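
All of the models above are masked language models, so they can be queried directly with the Hugging Face `transformers` fill-mask pipeline. Below is a minimal sketch; the repository ID `cahya/bert-base-indonesian-522M` is an assumption inferred from the listed model name, not something this page states explicitly.

```python
# Minimal sketch: filling a masked token with an Indonesian BERT model.
# The repo ID is assumed from the listing name "Bert Base Indonesian 522M".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="cahya/bert-base-indonesian-522M",  # assumed Hugging Face repo ID
)

# BERT-style models use the [MASK] token; RoBERTa-style models use <mask>.
for candidate in fill_mask("Saya sedang belajar bahasa [MASK]."):
    print(f"{candidate['token_str']!r}: {candidate['score']:.3f}")
```

The same pattern applies to the IndoBERT checkpoints by swapping in the corresponding repository ID (e.g. `indobenchmark/indobert-base-p1`, likewise an inferred name).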